Integrating Psychometrics and Computing Perspectives on Bias and Fairness in Affective Computing: A Case Study of Automated Video Interviews

Booth, Brandon M., Hickman, Louis, Subburaj, Shree Krishna, Tay, Louis, Woo, Sang Eun, D'Mello, Sidney K.

arXiv.org Artificial Intelligence

We provide a psychometrically grounded exposition of bias and fairness as applied to a typical machine learning pipeline for affective computing. We expand on an interpersonal communication framework to elucidate how to identify sources of bias that may arise in the process of inferring human emotions and other psychological constructs from observed behavior. Various methods and metrics for measuring fairness and bias are discussed, along with pertinent implications within the United States legal context. We illustrate how to measure some types of bias and fairness in a case study involving automatic personality and hireability inference from multimodal data collected in video interviews for mock job applications. We encourage affective computing researchers and practitioners to encapsulate bias and fairness in their research processes and products and to consider their role, agency, and responsibility in promoting equitable and just systems.

The tools used in affective computing (AC), which enable machines to identify people's behaviors and mental states, are increasingly utilized in education, healthcare, and the workplace. One application is to aid in the allocation of limited resources (e.g., counseling, mental health care, in-person interviews) via automated screening [1-3]. In these high-stakes scenarios, the assessments provided by AC systems can directly affect the decision processes that determine the amount of attention, care, and opportunity afforded to individuals. It is therefore important that these processes be accurate, unbiased, and fair: any deficiencies or errors in these systems, whether stemming from the data they were trained on, the types of algorithms used, or the decision processes themselves, may disproportionately impact different groups of people and lead to ethical and legal concerns, not to mention pain and suffering for the vulnerable groups affected.
Simply put, AC systems must deter, not propagate, extant systems of inequity and injustice. Fortunately, we have decades of guidance on how to construct fair and unbiased measurement systems.
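One widely used benchmark in the US legal context the abstract alludes to is the EEOC's four-fifths rule for adverse impact in selection procedures. A minimal sketch of that check follows; the group labels and applicant counts are invented for illustration and are not data from the case study:

```python
# Hedged sketch of the four-fifths (80%) adverse-impact check from the
# EEOC Uniform Guidelines. Group labels and counts are illustrative only.

def selection_rate(n_selected, n_applicants):
    """Fraction of a group's applicants who pass the screening stage."""
    return n_selected / n_applicants

def adverse_impact_ratio(rates):
    """Lowest group selection rate divided by the highest.
    A ratio below 0.8 is the conventional red flag for adverse impact."""
    return min(rates.values()) / max(rates.values())

rates = {
    "group_a": selection_rate(30, 100),  # 30% advance past the automated screen
    "group_b": selection_rate(18, 100),  # 18% advance
}
ratio = adverse_impact_ratio(rates)  # 0.18 / 0.30 = 0.6
flagged = ratio < 0.8                # True: below the four-fifths threshold
```

Note that the four-fifths rule is a rule of thumb rather than a statistical test; practitioners typically pair it with significance testing of the rate difference.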


New York City's Law on Using Tech to Make Hiring Decisions Keeps Getting Weaker

Slate

After years of building experience, developing your knowledge, and honing your skills, you are finally ready to apply for your dream job. But by the time you find out there's an opening and gather your application materials, the position has been filled. The company had recruited candidates using targeted ads on social media and career-oriented websites, ads that you never saw for reasons that are unclear to you. You then apply to another employer, where a human recruiter is impressed by your resume and advances you to the interview stage. But this time, you're rejected after an awkward recorded video interview in which you answered questions read by a computer.


Personality Detection of Applicants And Employees Using K-mode Algorithm And Ocean Model

Mohan, Binisha, Joseph, Dinju Vattavayalil, Subhash, Bharat Plavelil

arXiv.org Artificial Intelligence

The combination of conduct, emotion, motivation, and thinking is referred to as personality. To shortlist candidates more effectively, many organizations rely on personality predictions. By grouping applicants based on the required personality preferences, a firm can hire or select the best candidate for the desired job description. A model is created to identify applicants' personality types so that employers may find qualified candidates by examining a person's facial expression, speech intonation, and resume. The paper also emphasises detecting changes in employee behaviour: employee attitudes and behaviour towards each set of questions are examined and analysed. Here, the K-Modes clustering method is used to predict employee well-being, including job pressure, the working environment, and relationships with peers, utilizing the OCEAN model and a CNN algorithm in the AVI-AI administrative system. Findings imply that AVIs can be used for efficient candidate screening with an AI decision agent. Study of this field extends beyond the current explorations and needs deeper models and new configurations that can handle extremely complex operations.
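The summary above names K-Modes clustering over categorical OCEAN trait data. As a rough illustration of that technique (not the paper's implementation; the trait encoding, the naive first-k initialization, and the toy data below are all assumptions), a minimal K-Modes loop looks like:

```python
# Minimal K-Modes sketch: cluster categorical tuples by Hamming distance
# to per-cluster "modes" (attribute-wise most frequent categories).
from collections import Counter

def kmodes(rows, k, iters=10):
    """Cluster categorical tuples; returns (modes, clusters)."""
    modes = [rows[i] for i in range(k)]  # naive init: first k rows
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # assign each row to the mode with the fewest mismatched attributes
        for row in rows:
            dists = [sum(a != b for a, b in zip(row, m)) for m in modes]
            clusters[dists.index(min(dists))].append(row)
        # update each mode attribute-wise to the most frequent category
        for i, members in enumerate(clusters):
            if members:
                modes[i] = tuple(Counter(col).most_common(1)[0][0]
                                 for col in zip(*members))
    return modes, clusters

# Toy data: hypothetical high/low responses on three OCEAN-style traits
rows = [("high", "low", "low"), ("low", "high", "high"),
        ("high", "low", "low"), ("low", "high", "high")]
modes, clusters = kmodes(rows, k=2)
```

Production implementations (e.g., the `kmodes` Python package) use smarter initialization such as Huang's density-based seeding rather than the first-k shortcut used here.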


Woke AI that claims to help firms improve diversity discriminates against candidates for home decor

Daily Mail - Science & tech

While most job interviews were once face-to-face affairs, during the Covid-19 pandemic there was a surge in the number of interviews taking place online. Amid this rise, many companies started using AI tools to sift through candidates before they were interviewed by a human. These tools are marketed as unbiased against gender and ethnicity, with developers claiming they can help to improve diversity in the workplace. However, a new study has warned that using AI in hiring is little better than 'automated pseudoscience'. Researchers from the University of Cambridge found that.


Staff Software Engineer, Machine Learning - Remote Tech Jobs

#artificialintelligence

We're a team that is connected by time. Life has taught us its true value and finite nature. We value every minute and are on a mission to return time. And we live and breathe that mission in everything we do -- from how we build our product that saves our customers time to how we operate as a company. Come work with a team that's intelligent yet humble, visionary yet gets things done.


A university professor wants to expose the hidden bias in AI, and then use it for good

#artificialintelligence

Lauren Rhue researches the fast-paced world of artificial intelligence and machine learning technology. But she wants everyone in it to slow down. Rhue, an assistant professor of information systems at the University of Maryland Robert H. Smith School of Business, recently audited emotion recognition technology within three facial recognition services: Amazon Rekognition, Face++, and Microsoft. Her research revealed what Rhue called "really stark" racial disparities. Amazon Rekognition is offered for use by other companies.


Robots as recruiters: Can artificial intelligence hire the right people?

#artificialintelligence

Recruitment is one of the core responsibilities of the HR department, with the hiring team being composed of individuals with the skills to conduct the vetting process. These individuals interact with candidates to ensure that they are the right fit for the role. But what happens when the "human" aspect of recruitment is removed entirely? Can artificial intelligence replace humans and still hire the right people? AI, in the form of machine learning, plays a huge role in hiring people.


New AI tools let you chat with your dead relatives

#artificialintelligence

New products that let people keep relatives "alive" via AI are proliferating, offering, say, an interactive conversation with a recently departed dad who took the time to record a video interview before he passed. Why it matters: As interest in genealogy and ancestry proliferates, these tools let families preserve memories and personal connections through generations, even giving children a sense of the physical presence of a relative who died before they were born. One such tool, StoryFile, was notably used at the late actor Ed Asner's memorial service, where mourners were invited to "converse" with the deceased at an interactive display that featured video and audio he recorded over several days before he died. At Asner's memorial, "many people just stopped by and asked a question or a couple questions," including Jason Alexander of "Seinfeld" fame, said Matt Asner, a TV and movie producer who now runs the Ed Asner Family Center, a nonprofit for people with special needs. The big picture: StoryFile is perhaps the most robust of a growing number of tools that help people create interactive digital memories of relatives.


Your next job interview could be with a robot

#artificialintelligence

A growing number of companies are using chat bots and AI-led video interviews to assess job candidates before a human recruiter even meets them. Why it matters: Automated interviews vastly expand the job candidate pool and are designed to ensure consistent hiring practices by rooting out ways that bias seeps into interviews, recruiters say. But job applicants complain they're dehumanizing and stressful. The big picture: Recruiters have been using artificial intelligence for a while to automate candidate searches or screen resumes, for example. AI-led video interviews, however, go beyond those practices, because candidates are assessed by a computer algorithm.


Is AI the Future of Recruitment? - ETHRWorld

#artificialintelligence

According to experts, the first thing that should be kept in mind is that AI should only assist humans in efficient decision making instead of making decisions on its own, writes Tejaswini Singhal. Things were simpler in the black-and-white era of human resource procedures and recruitment than they are today. The reason is that the demands, challenges, and options available to HR managers back then were very different from those available today. The world of recruitment has changed as artificial intelligence (AI) quickly becomes a must-have tool in every recruiter's toolbox. Skill sets have changed and are constantly changing.